Research Article | Open Access
Volume 2023 | Article ID 0057 | https://doi.org/10.34133/plantphenomics.0057

Predicting and Visualizing Citrus Color Transformation Using a Deep Mask-Guided Generative Network

Zehan Bao,1,6 Weifu Li,1,2,6 Jun Chen,1,2 Hong Chen,1,2 Vijay John,3 Chi Xiao,4 and Yaohui Chen5

1College of Informatics, Huazhong Agricultural University, Wuhan 430070, China
2Engineering Research Center of Intelligent Technology for Agriculture, Ministry of Education, Wuhan, China
3Guardian Robot Project, RIKEN, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
4Key Laboratory of Biomedical Engineering of Hainan Province, School of Biomedical Engineering, Hainan University, Haikou 570100, China
5College of Engineering, Huazhong Agricultural University, 430070 Wuhan, China
6These authors contributed equally to this work

Received: 21 Nov 2022
Accepted: 19 May 2023
Published: 07 Jun 2023

Abstract

Citrus rind color is a good indicator of fruit development, so methods to monitor and predict color transformation can inform crop management decisions and harvest scheduling. This work presents a complete workflow for predicting and visualizing citrus color transformation in the orchard with high accuracy and fidelity. A total of 107 sample Navel oranges were observed during the color transformation period, yielding a dataset of 7,535 citrus images. A framework is proposed that integrates visual saliency into deep learning; it consists of a segmentation network, a deep mask-guided generative network, and a loss network with manually designed loss functions. Moreover, the fusion of image features and temporal information enables a single model to predict the rind color at different time intervals, effectively reducing the number of model parameters. The semantic segmentation network of the framework achieves a mean intersection-over-union (mIoU) score of 0.9694, and the generative network obtains a peak signal-to-noise ratio (PSNR) of 30.01 and a mean local style loss of 2.710, indicating that the generated images are of high quality, closely resemble the references, and are consistent with human perception. To ease real-world application, the model is ported to an Android-based application for mobile devices. The methods can readily be extended to other fruit crops that undergo a color transformation period. The dataset and the source code are publicly available on GitHub.
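To make the framework described above concrete, the sketch below illustrates in PyTorch how a mask-guided generator conditioned on a time interval could be wired: the RGB crop and the segmentation mask are concatenated at the input, the time interval is embedded and fused with the bottleneck features, and only the masked fruit region is repainted. This is a minimal illustrative sketch, not the authors' implementation; all module names, channel widths, and the time-embedding scheme are assumptions.

```python
# Minimal sketch (hypothetical, not the authors' code) of a mask-guided
# generator with temporal conditioning, as outlined in the abstract.
import torch
import torch.nn as nn

class MaskGuidedGenerator(nn.Module):
    """Encoder-decoder that fuses an RGB image, a binary fruit mask,
    and a scalar time interval (days ahead) into a predicted image."""
    def __init__(self, base_ch: int = 32):
        super().__init__()
        # Image and mask are concatenated on the channel axis (3 + 1 = 4).
        self.encoder = nn.Sequential(
            nn.Conv2d(4, base_ch, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(base_ch, base_ch * 2, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # The time interval is embedded and broadcast over the feature map.
        self.time_embed = nn.Sequential(nn.Linear(1, base_ch * 2), nn.ReLU(inplace=True))
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(base_ch * 2, base_ch, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base_ch, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, image, mask, days_ahead):
        feat = self.encoder(torch.cat([image, mask], dim=1))
        t = self.time_embed(days_ahead.unsqueeze(-1))   # (B, C) time embedding
        feat = feat + t[:, :, None, None]                # fuse time with features
        out = self.decoder(feat)
        # Repaint only the masked (fruit) region; keep the background unchanged.
        return mask * out + (1.0 - mask) * image

if __name__ == "__main__":
    g = MaskGuidedGenerator()
    img = torch.rand(2, 3, 128, 128)                    # RGB crops in [0, 1]
    msk = (torch.rand(2, 1, 128, 128) > 0.5).float()    # binary fruit masks
    dt = torch.tensor([7.0, 14.0])                      # predict 7 and 14 days ahead
    print(g(img, msk, dt).shape)                        # torch.Size([2, 3, 128, 128])
```

In the full framework, the generator output would additionally be scored by the loss network's manually designed losses (for example, the local style loss reported above) during training.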
